    Exact Common Information

    This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables (X, Y). We introduce the quantity G(X;Y) = \min_{X \to W \to Y} H(W) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS (X, Y). We give a multiletter characterization for it as the limit \bar{G}(X;Y) = \lim_{n \to \infty} (1/n) G(X^n; Y^n). While in general \bar{G}(X;Y) is greater than or equal to the Wyner common information, we show that they are equal for the Symmetric Binary Erasure Source. We do not know, however, whether the exact common information rate has a single-letter characterization in general.
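The quantity G(X;Y) minimizes H(W) over all variables W for which X -> W -> Y forms a Markov chain, i.e. for which p(x, y) = \sum_w p(w) p(x|w) p(y|w). A minimal Python sketch (the helper names are illustrative, not from the paper) evaluates H(W) for one candidate decomposition and reconstructs the joint pmf that the decomposition induces:

```python
import math

def entropy(p):
    """Shannon entropy of a probability vector, in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def joint_from_decomposition(p_w, p_x_given_w, p_y_given_w):
    """Reconstruct p(x, y) = sum_w p(w) p(x|w) p(y|w),
    the joint pmf induced by a Markov chain X -> W -> Y."""
    nx, ny = len(p_x_given_w[0]), len(p_y_given_w[0])
    joint = [[0.0] * ny for _ in range(nx)]
    for w, pw in enumerate(p_w):
        for x in range(nx):
            for y in range(ny):
                joint[x][y] += pw * p_x_given_w[w][x] * p_y_given_w[w][y]
    return joint

# Toy example: X = Y = a fair bit. Taking W = X is a valid
# decomposition, costing H(W) = 1 bit of common randomness.
p_w = [0.5, 0.5]
p_x_given_w = [[1, 0], [0, 1]]   # X = W deterministically
p_y_given_w = [[1, 0], [0, 1]]   # Y = W deterministically
print(entropy(p_w))   # 1.0
print(joint_from_decomposition(p_w, p_x_given_w, p_y_given_w))
# [[0.5, 0.0], [0.0, 0.5]]  -- X and Y are equal fair bits

# Conversely, if X and Y are independent, the constant W (p_w = [1.0])
# satisfies the Markov condition and H(W) = 0, so G(X;Y) = 0.
```

G(X;Y) itself would require minimizing entropy(p_w) over all valid decompositions, an optimization the paper studies; the sketch only checks a single candidate.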